Tensor Decompositions with Banded Matrix Factors

Authors

  • Mikael Sørensen
  • Pierre Comon
Abstract

The computation of the model parameters of a Canonical Polyadic Decomposition (CPD), also known as the parallel factor (PARAFAC), canonical decomposition (CANDECOMP), or CP decomposition, is typically done by resorting to iterative algorithms, e.g., iterative alternating least squares or descent methods. In many practical problems involving tensor decompositions, such as in signal processing, some of the matrix factors are banded. First, we develop methods for the computation of CPDs with one banded matrix factor; this reduces to best rank-1 tensor approximation problems. Second, we propose methods to compute CPDs with more than one banded matrix factor. Third, we extend the developed methods to also handle banded and structured matrix factors, such as Hankel or Toeplitz. Computer results are also reported.
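As a concrete illustration of the alternating least squares (ALS) approach mentioned in the abstract, here is a minimal unstructured CPD-ALS sketch in NumPy. It does not implement the paper's banded-factor methods; the function names and the fixed iteration count are illustrative assumptions.

```python
import numpy as np

def khatri_rao(X, Y):
    """Column-wise Kronecker (Khatri-Rao) product of X (I x R) and Y (J x R)."""
    return np.einsum('ir,jr->ijr', X, Y).reshape(-1, X.shape[1])

def cpd_als(T, R, n_iter=200, seed=0):
    """Generic CPD via ALS (illustrative sketch, no banded structure).

    Returns factors A, B, C with T[i,j,k] ~ sum_r A[i,r] B[j,r] C[k,r].
    """
    rng = np.random.default_rng(seed)
    I, J, K = T.shape
    A = rng.standard_normal((I, R))
    B = rng.standard_normal((J, R))
    C = rng.standard_normal((K, R))
    # Mode-n unfoldings (row-major): T1[i, j*K+k], T2[j, i*K+k], T3[k, i*J+j]
    T1 = T.reshape(I, J * K)
    T2 = np.moveaxis(T, 1, 0).reshape(J, I * K)
    T3 = np.moveaxis(T, 2, 0).reshape(K, I * J)
    for _ in range(n_iter):
        # Each factor solves a linear least squares problem with the
        # Khatri-Rao product of the other two factors as coefficient matrix.
        A = np.linalg.lstsq(khatri_rao(B, C), T1.T, rcond=None)[0].T
        B = np.linalg.lstsq(khatri_rao(A, C), T2.T, rcond=None)[0].T
        C = np.linalg.lstsq(khatri_rao(A, B), T3.T, rcond=None)[0].T
    return A, B, C
```

The paper's methods additionally exploit the banded structure of a factor; this sketch only shows the baseline iteration such methods compare against.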


Related articles

Tensor-Train Ranks for Matrices and Their Inverses

We show that the recent tensor-train (TT) decompositions of matrices arise from their recursive Kronecker-product representations with a systematic use of common bases. The names TTM and QTT used in this case stress the relation with multilevel matrices or with quantization, which artificially increases the number of levels. Then we investigate how the tensor-train ranks of a matrix can be related to ...
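The TT format referenced in this snippet can be illustrated by the standard TT-SVD procedure (sequential SVDs of reshapings of the tensor); this is a generic sketch with an assumed truncation tolerance, not the TTM/QTT construction discussed in the paper.

```python
import numpy as np

def tt_svd(T, eps=1e-10):
    """Decompose a dense tensor into TT cores via sequential SVDs.

    Each core has shape (r_prev, dim_k, r_next); ranks are chosen by
    truncating singular values below eps * (largest singular value).
    """
    dims = T.shape
    d = len(dims)
    cores = []
    r = 1
    M = T.reshape(dims[0], -1)
    for k in range(d - 1):
        U, s, Vt = np.linalg.svd(M, full_matrices=False)
        rank = max(1, int(np.sum(s > eps * s[0])))
        cores.append(U[:, :rank].reshape(r, dims[k], rank))
        # Carry the remaining factor forward and expose the next mode.
        M = (s[:rank, None] * Vt[:rank]).reshape(rank * dims[k + 1], -1)
        r = rank
    cores.append(M.reshape(r, dims[-1], 1))
    return cores
```

Contracting the cores in order (matching the last rank index of each core with the first of the next) recovers the original tensor up to the truncation tolerance.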


The geometry of rank decompositions of matrix multiplication II: $3\times 3$ matrices

This is the second in a series of papers on rank decompositions of the matrix multiplication tensor. We present new rank $23$ decompositions for the $3\times 3$ matrix multiplication tensor $M_{\langle 3\rangle}$. All our decompositions have symmetry groups that include the standard cyclic permutation of factors but otherwise exhibit a range of behavior. One of them has 11 cubes as summands and...


Tensor Decompositions: A New Concept in Brain Data Analysis?

Matrix factorizations and their extensions to tensor factorizations and decompositions have become prominent techniques for linear and multilinear blind source separation (BSS), especially multiway Independent Component Analysis (ICA), Nonnegative Matrix and Tensor Factorization (NMF/NTF), Smooth Component Analysis (SmoCA) and Sparse Component Analysis (SCA). Moreover, tensor decompositions hav...


Banded Householder representation of linear subspaces

We show how to compactly represent any n-dimensional subspace of R^m as a banded product of Householder reflections using n(m − n) floating-point numbers. This is optimal, since these subspaces form a Grassmannian space Gr_n(m) of dimension n(m − n). The representation is stable and easy to compute: any matrix can be factored into the product of a banded Householder matrix and a square matrix using ...
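A single Householder reflection, the building block this snippet refers to, can be sketched as follows; this is the generic reflection that annihilates all but the first entry of a vector, not the paper's banded product construction.

```python
import numpy as np

def householder(x):
    """Reflection H = I - 2 v v^T / (v^T v) with H @ x = -sign(x[0]) ||x|| e1.

    The sign choice in v[0] avoids cancellation for numerical stability.
    """
    x = np.asarray(x, dtype=float)
    s = 1.0 if x[0] >= 0 else -1.0
    v = x.copy()
    v[0] += s * np.linalg.norm(x)
    return np.eye(x.size) - 2.0 * np.outer(v, v) / (v @ v)

x = np.array([3.0, 4.0, 0.0])
H = householder(x)
# H @ x zeroes everything below the first entry: [-5, 0, 0]
```

A product of n such reflections represents the orthogonal factor of a QR decomposition of an m x n matrix; the paper's contribution is arranging the reflection vectors so that the product is banded and uses only n(m − n) parameters.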


Better Late Than Never: Filling a Void in the History of Fast Matrix Multiplication and Tensor Decompositions

Multilinear and tensor decompositions are a popular tool in linear and multilinear algebra and have a wide range of important applications to modern computing. Our paper of 1972 presented the first nontrivial application of such decompositions to fundamental matrix computations and was also a landmark in the history of the acceleration of matrix multiplication. Published in 1972 in Russian, it ...




Publication year: 2017